ECE 6605 - Information Theory

Prof. Matthieu Bloch

Tuesday, October 31, 2023

Announcements

  • Assignment 2 still being graded

  • Assignment 3

    • Posted earlier today
    • Due November 9, 2023 on Gradescope
  • Last time

    • Channel coding theorem
  • Today

    • Wrap up channel coding theorem
    • Soft-covering
  • Later this week

    • Secrecy and secret-key generation!

Achievability for channel coding (random coding)

  • Consider a generic channel \((\calU,P_{V|U},\calV)\) with message set \(\{1,\cdots,M\}\)

  • For \(\gamma>0\), let

    \[\begin{align*} \calA_\gamma \eqdef \left\{(u,v)\in\calU\times\calV:\log\frac{P_{V|U}(v|u)}{P_V(v)}\geq\gamma\right\} \end{align*}\]

  • Encoder

    • For each \({m}\in\{1,\cdots,M\}\), \(f(m)\) is a codeword \(u\) drawn independently according to \(P_U\).
  • Decoder: \(g\) is defined as follows. Upon receiving a channel output \(v\):

    • If there exists a unique \(\hat{m}\) such that \((f(\hat{m}),v)\in\calA_\gamma\), return \(\hat{m}\);
    • Else, return a pre-defined arbitrary \(m_0\)

For any \(\gamma>0\),

\[\begin{align*} \E[C]{P_e(C)} \leq \P[P_UP_{V|U}]{(U,V)\notin \calA_\gamma} + M2^{-\gamma}. \end{align*}\]
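As a sanity check on this bound, here is a minimal Monte Carlo sketch for a BSC with uniform inputs and threshold decoding; the parameters \(n\), \(M\), \(p\), and \(\gamma\) below are illustrative choices, not values from the lecture.

```python
import numpy as np
from math import comb, log2

rng = np.random.default_rng(0)

# Illustrative parameters (assumptions): BSC(p) with uniform P_U
n, M, p, gamma = 12, 8, 0.1, 6.0
num_trials = 20000

def info_density(d):
    # log2( P_{V|U}(v|u) / P_V(v) ) for a BSC(p) with uniform P_U,
    # written as a function of the Hamming distance d between u and v
    return n + d * log2(p) + (n - d) * log2(1 - p)

errors = 0
for _ in range(num_trials):
    code = rng.integers(0, 2, size=(M, n))   # codewords drawn i.i.d. according to P_U
    v = code[0] ^ (rng.random(n) < p)        # transmit message 0 over the BSC
    d = np.sum(code != v, axis=1)            # Hamming distances to all codewords
    hits = np.flatnonzero(info_density(d) >= gamma)
    if len(hits) != 1 or hits[0] != 0:       # success iff unique hit = sent message
        errors += 1

# Right-hand side: P[(U,V) not in A_gamma] + M 2^{-gamma}, computed exactly
p_atypical = sum(comb(n, d) * p**d * (1 - p)**(n - d)
                 for d in range(n + 1) if info_density(d) < gamma)
print(f"empirical average Pe ~ {errors / num_trials:.3f}")
print(f"right-hand side      = {p_atypical + M * 2**-gamma:.3f}")
```

The empirical average error probability should fall below the right-hand side, as the lemma guarantees in expectation over the random codebook.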

Comments on channel coding result

  • Maximum vs. average probability of error
  • Additive White Gaussian Noise (AWGN) channels
    1. \(Y_i = X_i + N_i\) where \(N_i\sim \calN(0,\sigma^2)\)
    2. Power constraint: codewords should satisfy \(\frac{1}{n}\sum_{i=1}^nx_i^2\leq P\)
    3. What is the capacity? (see the worked formula after this list)
  • What about continuous-time channels?
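For the record, the answer to the capacity question is the classical Shannon formula, achieved by a Gaussian input \(X\sim\calN(0,P)\) meeting the power constraint:

\[\begin{align*} C = \max_{P_X:\,\mathbb{E}[X^2]\leq P}\mathbb{I}(X;Y) = \frac{1}{2}\log_2\left(1+\frac{P}{\sigma^2}\right)\ \text{bits/channel use}. \end{align*}\]

For instance, at an SNR of \(P/\sigma^2 = 15\), \(C = \frac{1}{2}\log_2 16 = 2\) bits/channel use.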

Soft covering

[Figure: Soft covering coding model]
  • We assume for now that the message is uniformly distributed and that the input reference distribution \(P_X\) is known and fixed.

An \((n,M_n)\) code \(\calC\) for soft covering over a discrete memoryless channel \(P_{Z|X}\) consists of an encoding function \(f_n:\{1,\cdots,M_n\}\to\calX^n\).

  • The performance of an \((n,M_n)\) code is measured in terms of
    1. The rate of the code \(R\eqdef \frac{\log_2 M_n}{n}\) (bits/channel use)
    2. The approximation of the output statistics \[ S^{(n)}\eqdef \mathbb{V}(P_{Z^n},P_Z^{\otimes n}) \] (illustrated in the sketch after this list)
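To make the metric concrete, here is a minimal sketch computing \(S^{(n)}\) exactly for a toy code; the BSC, the two-codeword repetition codebook, and all parameters are illustrative assumptions.

```python
import itertools
import numpy as np

p, n = 0.2, 3                      # illustrative BSC flip probability, blocklength
code = np.array([[0, 0, 0],        # toy codebook with M = 2 codewords
                 [1, 1, 1]])
M = len(code)

def channel_prob(z, x):
    # P_{Z|X}^{\otimes n}(z|x) for a memoryless BSC(p)
    d = np.sum(z != x)
    return p**d * (1 - p)**(n - d)

tv = 0.0
for z in itertools.product([0, 1], repeat=n):
    z = np.array(z)
    induced = sum(channel_prob(z, x) for x in code) / M   # P_{Z^n}(z), uniform message
    target = 2.0**-n       # P_Z^{\otimes n}(z) is uniform when P_X = Bern(1/2)
    tv += abs(induced - target)
print(f"S^(n) = {tv / 2:.4f}")     # total variation distance
```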

Channel resolvability

  • Define \[ C_{\textsf{r}}\eqdef \inf\left\{R: \exists \set{f_n}_{n\geq 1} \textsf{ with }\lim_{n\to\infty}\frac{\log M_n}{n}\leq R \text{ and } \lim_{n\to\infty}S^{(n)}=0\right\} \]

For a discrete memoryless channel characterized by \(P_{Z|X}\) and an input distribution \(P_X\), \(C_{\textsf{r}} = \mathbb{I}(X;Z)\).

  • Remarks
    • This result is called the channel resolvability theorem
    • Given a statistical characterization of a channel and an input distribution, we can use randomness to remove structure: the output of a well-chosen random code becomes statistically indistinguishable from an i.i.d. output (see the numerical illustration below)
  • Proof of the result
    • We will use again an achievability and a converse
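As a quick numerical illustration before the proof: for finite alphabets, the resolvability rate \(\mathbb{I}(X;Z)\) can be computed directly from \(P_X\) and \(P_{Z|X}\). A minimal sketch, where the BSC instance is an illustrative assumption:

```python
import numpy as np

def mutual_information(P_X, P_ZgX):
    """I(X;Z) in bits; P_X has shape (|X|,), P_ZgX has shape (|X|, |Z|)
    with rows P_{Z|X}(.|x)."""
    P_XZ = P_X[:, None] * P_ZgX        # joint distribution P_{XZ}
    P_Z = P_XZ.sum(axis=0)             # output marginal
    denom = P_X[:, None] * P_Z[None, :]
    mask = P_XZ > 0                    # skip 0 log 0 terms
    return float(np.sum(P_XZ[mask] * np.log2(P_XZ[mask] / denom[mask])))

p = 0.1                                # illustrative BSC crossover probability
P_ZgX = np.array([[1 - p, p],
                  [p, 1 - p]])
P_X = np.array([0.5, 0.5])
print(f"C_r = I(X;Z) = {mutual_information(P_X, P_ZgX):.3f} bits")  # ~0.531 = 1 - h2(0.1)
```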

Achievability (random coding)

  • Consider a generic channel \((\calU,P_{V|U},\calV)\) with message set \(\{1,\cdots,M\}\)

  • For \(\gamma>0\), let

    \[\begin{align*} \calC_\gamma \eqdef \left\{(u,v)\in\calU\times\calV:\log\frac{P_{V|U}(v|u)}{P_V(v)}\leq\gamma\right\} \end{align*}\]

  • Encoder

    • For each \({m}\in\{1,\cdots,M\}\), \(f(m)\) is a codeword \(u\) drawn independently according to \(P_U\).

For any \(\gamma>0\),

\[\begin{align*} \E[C]{S(C)} \leq \P[P_UP_{V|U}]{(U,V)\notin \calC_\gamma} + \frac{1}{2}\sqrt{\frac{2^{\gamma}}{M}}. \end{align*}\]
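A minimal numerical companion to this lemma, assuming a BSC with uniform \(P_U\) (all parameters illustrative): averaging the exact total variation \(S(C)\) over random codebooks shows \(\mathbb{E}[S(C)]\) shrinking as \(M\) grows.

```python
import itertools
import numpy as np

rng = np.random.default_rng(1)
p, n, trials = 0.2, 6, 200             # illustrative BSC(p), blocklength, codebooks
outputs = np.array(list(itertools.product([0, 1], repeat=n)))   # all 2^n outputs
target = np.full(2**n, 2.0**-n)        # P_Z^{\otimes n} is uniform for Bern(1/2) input

def tv_of_code(code):
    # Exact induced output distribution P_{Z^n}, then its TV to the target
    dists = (outputs[:, None, :] != code[None, :, :]).sum(axis=2)
    induced = (p**dists * (1 - p)**(n - dists)).mean(axis=1)
    return 0.5 * np.abs(induced - target).sum()

for M in [2, 8, 32, 128]:
    avg = np.mean([tv_of_code(rng.integers(0, 2, size=(M, n)))
                   for _ in range(trials)])
    print(f"M = {M:4d}   E[S(C)] ~ {avg:.3f}")
```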

Converse

Consider a sequence of codes \(\set{f_n}_{n\geq 1}\) such that \(\lim_{n\to\infty}S^{(n)}=0\) and \(\lim_{n\to\infty}\frac{\log M_n}{n}\leq R\). Then \[ R\geq \min_{P:P\circ P_{Z|X}=P_Z}\mathbb{I}(X;Z) \]
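One standard route to this converse, sketched here as an outline (the continuity-of-entropy details are omitted): with \(X^n = f_n(W)\) for \(W\) uniform on \(\{1,\cdots,M_n\}\),

\[\begin{align*} \log M_n \geq H(X^n) \geq \mathbb{I}(X^n;Z^n) = H(Z^n) - \sum_{i=1}^n H(Z_i|X_i), \end{align*}\]

where the equality uses the memorylessness of \(P_{Z|X}\). As \(S^{(n)}\to 0\), \(\frac{1}{n}H(Z^n)\to H(Z)\) by continuity of entropy in total variation; since \(H(Z|X)\) is linear in the input distribution, the average \(\bar{P}\eqdef\frac{1}{n}\sum_{i=1}^nP_{X_i}\), whose output approaches \(P_Z\), gives \(\frac{1}{n}\log M_n \gtrsim \mathbb{I}(X;Z)_{\bar{P}} \geq \min_{P:P\circ P_{Z|X}=P_Z}\mathbb{I}(X;Z)\).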